# Optimal training calculation

Cerebras-GPT 2.7B
License: Apache-2.0
Cerebras-GPT 2.7B is a Transformer-based language model intended to support research on large language models; it can also serve as a base model for natural language processing applications.
Tags: Large Language Model · Transformers · English
Publisher: cerebras
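The tag "Optimal training calculation" refers to compute-optimal (Chinchilla-style) scaling, which the Cerebras-GPT family was trained to follow. A minimal sketch of the idea, assuming the commonly cited heuristics of roughly 20 training tokens per parameter and ~6·N·D FLOPs per training run (both are rules of thumb, not values stated on this page):

```python
def chinchilla_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Approximate compute-optimal training-token count using the
    Chinchilla rule of thumb (~20 tokens per model parameter)."""
    return n_params * tokens_per_param


def training_flops(n_params: float, n_tokens: float) -> float:
    """Standard ~6*N*D estimate of total training FLOPs for a
    dense Transformer with N parameters trained on D tokens."""
    return 6.0 * n_params * n_tokens


if __name__ == "__main__":
    params = 2.7e9  # Cerebras-GPT 2.7B
    tokens = chinchilla_optimal_tokens(params)
    print(f"optimal tokens: {tokens:.2e}")                      # ~5.4e10
    print(f"training FLOPs: {training_flops(params, tokens):.2e}")
```

For a 2.7B-parameter model this heuristic suggests on the order of 54 billion training tokens; the exact ratio used for any given model is set by its authors.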